Optimally weighted loss functions for solving PDEs with Neural Networks

Authors

Abstract

Recent works have shown that deep neural networks can be employed to solve partial differential equations, giving rise to the framework of physics-informed neural networks (Raissi et al., 2019). We introduce a generalization of these methods that manifests as a scaling parameter which balances the relative importance of the different constraints imposed by partial differential equations. A mathematical motivation for these generalized methods is provided, which shows that for linear and well-posed partial differential equations the functional form is convex. We then derive a choice of the scaling parameter that is optimal with respect to a measure of the relative error. Because this optimal choice relies on having full knowledge of analytical solutions, we also propose a heuristic method to approximate it. The proposed methods are compared numerically to the original methods on a variety of model partial differential equations, with the number of data points being updated adaptively. For several problems, including high-dimensional PDEs, the proposed methods significantly enhance accuracy.
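The core idea, a single scalar weight trading off the PDE-residual term against the boundary-condition term in the training loss, can be sketched as follows. This is a minimal illustration only, assuming a toy 1D Poisson problem u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0, a small fully connected network, and a hand-picked weight `lam`; how to choose or approximate this weight optimally is the paper's contribution and is not reproduced here.

```python
# Hedged sketch of a weighted physics-informed loss: residual + lam * boundary.
import math
import torch

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def f(x):
    # illustrative source term whose exact solution is u(x) = sin(pi * x)
    return -(math.pi ** 2) * torch.sin(math.pi * x)

def weighted_pinn_loss(lam, n_interior=128):
    # PDE-residual term on random interior collocation points
    x = torch.rand(n_interior, 1, requires_grad=True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    residual = ((d2u - f(x)) ** 2).mean()
    # boundary term enforcing u(0) = u(1) = 0
    xb = torch.tensor([[0.0], [1.0]])
    boundary = (net(xb) ** 2).mean()
    # lam is the scaling parameter balancing the two constraints
    return residual + lam * boundary

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = weighted_pinn_loss(lam=100.0)  # fixed weight here; the paper derives/approximates an optimal choice
    loss.backward()
    opt.step()
```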


Similar articles

Solving PDEs with Radial Basis Functions

Finite differences was the first numerical approach that permitted large-scale simulations in many application areas, such as geophysical fluid dynamics. As accuracy and integration time requirements gradually increased, the focus shifted from finite differences to a variety of different spectral methods. During the last few years, radial basis functions, in particular in their ‘local’ RBF-FD ...


Solving for high dimensional committor functions using artificial neural networks

In this note we propose a method based on artificial neural networks to study the transition between states governed by stochastic processes. In particular, we aim for numerical schemes for the committor function, the central object of transition path theory, which satisfies a high-dimensional Fokker-Planck equation. By working with the variational formulation of such partial differential equati...
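A rough sketch of the variational idea, under several assumptions not taken from the cited paper (a quadratic placeholder potential, soft penalty terms for the boundary conditions on the metastable sets A and B, and Monte Carlo samples supplied by the caller):

```python
# Hedged illustration: approximate a committor q(x) with a neural network by
# minimizing a Monte Carlo estimate of a Dirichlet-type energy weighted by
# exp(-beta * V), plus penalties enforcing q = 0 on set A and q = 1 on set B.
import torch

dim = 10                                   # assumed problem dimension
q_net = torch.nn.Sequential(
    torch.nn.Linear(dim, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1), torch.nn.Sigmoid(),
)

def potential(x):
    # placeholder potential V(x); a real application would supply its own
    return 0.5 * (x ** 2).sum(dim=1, keepdim=True)

def committor_loss(x_samples, x_in_A, x_in_B, beta=1.0, penalty=10.0):
    x = x_samples.clone().requires_grad_(True)
    q = q_net(x)
    grad_q = torch.autograd.grad(q, x, torch.ones_like(q), create_graph=True)[0]
    weight = torch.exp(-beta * potential(x))
    energy = (weight * (grad_q ** 2).sum(dim=1, keepdim=True)).mean()
    # soft boundary conditions: q = 0 on A, q = 1 on B
    bc = (q_net(x_in_A) ** 2).mean() + ((q_net(x_in_B) - 1.0) ** 2).mean()
    return energy + penalty * bc
```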


Solving PDEs with Intrepid

Intrepid is a Trilinos package for advanced discretizations of Partial Differential Equations (PDEs). The package provides a comprehensive set of tools for local, cell-based construction of a wide range of numerical methods for PDEs. This paper describes the mathematical ideas and software design principles incorporated in the package. We also provide representative examples showcasing the use ...


On Loss Functions for Deep Neural Networks in Classification

Deep neural networks are currently among the most commonly used classifiers. Despite easily achieving very good performance, one of the best selling points of these models is their modular design – one can conveniently adapt their architecture to specific needs, change connectivity patterns, attach specialised layers, experiment with a large amount of activation functions, normalisation schemes...


Robust Loss Functions under Label Noise for Deep Neural Networks

In many applications of classifier learning, training data suffers from label noise. Deep networks are learned using huge training data where the problem of noisy labels is particularly relevant. The current techniques proposed for learning deep networks under label noise focus on modifying the network architecture and on algorithms for estimating true labels from noisy labels. An alternate app...
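One commonly studied instance of a loss that is less sensitive to noisy labels is the mean absolute error between softmax probabilities and one-hot targets; the sketch below contrasts it with standard cross-entropy. The shapes, random data, and the choice of MAE are illustrative assumptions, not necessarily the specific losses analysed in the cited paper.

```python
# Hedged comparison of standard cross-entropy with an MAE loss on softmax outputs.
import torch
import torch.nn.functional as F

def cross_entropy_loss(logits, labels):
    # standard classification loss (tends to be sensitive to label noise)
    return F.cross_entropy(logits, labels)

def mae_loss(logits, labels, num_classes):
    # L1 distance between softmax probabilities and one-hot targets
    probs = F.softmax(logits, dim=1)
    one_hot = F.one_hot(labels, num_classes).float()
    return (probs - one_hot).abs().sum(dim=1).mean()

# toy usage with random data: batch of 8 examples, 5 classes
logits = torch.randn(8, 5)
labels = torch.randint(0, 5, (8,))
print(cross_entropy_loss(logits, labels).item())
print(mae_loss(logits, labels, num_classes=5).item())
```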



Journal

Journal title: Journal of Computational and Applied Mathematics

Year: 2022

ISSN: 0377-0427, 1879-1778, 0771-050X

DOI: https://doi.org/10.1016/j.cam.2021.113887